
TinyML on SEEED XIAO RP2040 (Motion Recognition)

In this wiki, we will show you how to use the accelerometer with the Seeed Studio XIAO RP2040, combined with Edge Impulse, to enable motion recognition. The code we present here is supported by the latest version of the XIAO RP2040 boards package.

Materials Required

Hardware

In this wiki, we need to prepare the following materials:

  • Seeed Studio XIAO RP2040
  • A Grove 3-Axis Digital Accelerometer (MMA7660, the sensor used in the code below)
  • A Seeed Studio XIAO Expansion Board (or another way to wire the accelerometer to the XIAO's I2C pins) and a USB cable

Hardware Set up

Connect the accelerometer to the XIAO RP2040's I2C pins (for example through a Grove port on the expansion board), then connect the board to your computer over USB.

Software

The required libraries are introduced in the steps below (the U8g2 display library and the Grove MMA7660 accelerometer driver). It is highly recommended that you use the code here to check whether the hardware is functioning properly. If you have any problem installing a library, please refer to here.
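If you want to rule out wiring problems before going further, a plain I2C scan is a quick check. The sketch below is a minimal example added here for convenience (it is not part of the original code) and assumes the accelerometer is connected to the XIAO RP2040's default Wire pins; the MMA7660 normally answers at address 0x4C.

#include <Wire.h>

void setup() {
    Serial.begin(115200);
    while (!Serial);
    Wire.begin();    // default I2C pins of the XIAO RP2040
}

void loop() {
    Serial.println("Scanning I2C bus...");
    for (uint8_t addr = 1; addr < 127; addr++) {
        Wire.beginTransmission(addr);
        if (Wire.endTransmission() == 0) {    // 0 means a device ACKed at this address
            Serial.print("Device found at 0x");
            Serial.println(addr, HEX);        // the MMA7660 is expected at 0x4C
        }
    }
    delay(2000);
}

If the scan does not report any device, re-check the wiring before moving on.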

Get started

First, we are going to run some demos to check whether the board and the display screen are functioning well. If yours are fine, you can move on to the next instruction.

Check the circuit connection and accelerometer

Open the Arduino IDE, navigate to Sketch -> Include Library -> Manage Libraries..., then search for and install the U8g2 library in the Library Manager.


After the installation, copy the following code and run it.

#include <Wire.h>
#include "MMA7660.h"

MMA7660 accelemeter;

#define CONVERT_G_TO_MS2 9.80665f

void setup() {
    Serial.begin(115200);
    while (!Serial);
    accelemeter.init();
}

void loop() {
    float ax, ay, az;

    // Read the three axes of the accelerometer (in g)
    accelemeter.getAcceleration(&ax, &ay, &az);

    // Print the readings converted to m/s^2, tab-separated
    Serial.print(ax * CONVERT_G_TO_MS2, 4);
    Serial.print('\t');
    Serial.print(ay * CONVERT_G_TO_MS2, 4);
    Serial.print('\t');
    Serial.println(az * CONVERT_G_TO_MS2, 4);
}

After uploading the code to the Seeed Studio XIAO RP2040, open the Serial Monitor; you should see three tab-separated acceleration values (in m/s²) printed continuously.

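The demo above only uses the serial port. If you also want to confirm that the display screen works (this is what the U8g2 library installed earlier is for), the optional sketch below is one way to do it; it is not part of the original guide and assumes the display is an SSD1306 128x64 OLED on the hardware I2C bus, as found on the Seeed Studio XIAO expansion board.

#include <Wire.h>
#include <U8x8lib.h>
#include "MMA7660.h"

MMA7660 accelemeter;
U8X8_SSD1306_128X64_NONAME_HW_I2C u8x8(/* reset=*/ U8X8_PIN_NONE);

void setup() {
    accelemeter.init();
    u8x8.begin();
    u8x8.setFlipMode(1);                          // rotate the text if it appears upside down
    u8x8.setFont(u8x8_font_chroma48medium8_r);
}

void loop() {
    float ax, ay, az;
    accelemeter.getAcceleration(&ax, &ay, &az);   // readings in g

    // Show one axis per line on the OLED
    u8x8.clearLine(0); u8x8.setCursor(0, 0); u8x8.print("X: "); u8x8.print(ax, 2);
    u8x8.clearLine(2); u8x8.setCursor(0, 2); u8x8.print("Y: "); u8x8.print(ay, 2);
    u8x8.clearLine(4); u8x8.setCursor(0, 4); u8x8.print("Z: "); u8x8.print(az, 2);

    delay(200);
}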

If all is fine, we can move on and connect Seeed Studio XIAO RP2040 to Edge Impulse.

Connecting to Edge Impulse

The precision of the trained model is very important to the final result. If your training accuracy is lower than 65%, we highly recommend that you train more times or add more data.

  • Step 1. Log in to Edge Impulse (sign up for a free account if needed) and create a new project.

  • Step 2. Choose "Accelerometer data" and click "Let’s get started!"


  • Step 3. Install the Edge Impulse CLI on your computer.

  • Step 4. Run the following command in your terminal (or cmd/PowerShell) to start the data forwarder:

sudo edge-impulse-data-forwarder

  • Step 5. We need to use the CLI to connect the Seeed Studio XIAO RP2040 to Edge Impulse. First, log in to your account and choose your project.

Name the accelerometer and the device.

Move back to the Edge Impulse "Data acquisition" page; if the connection is successful, the device "XIAO RP2040" is shown on the right of the page.


  • Step 6. Select the sensor as "3 axes". Name your label (for example, up and down), set the Sample length (ms.) to 20000, and click Start sampling.


  • Step 7. Swing the Seeed Studio XIAO RP2040 up and down and keep the motion going for 20 seconds. The acquisition will then show up on the page.


  • Step 8. Split the data: open the menu at the top right of the raw data and choose "Split Sample". Click +Add Segment and then click the graph; repeat this more than 20 times to add segments. Click Split and you will see the sample data, each segment one second long.


  • Step 9. Repeat Step 7 and Step 8 with different label names to collect different motion data, such as circle, line, and so on. The example provided here classifies up and down, left and right, and circle. You can change these to whatever motions you want.
Note

In Step 8 the split time is 1 second, which means you should complete at least one up-and-down swing per second in Step 7; otherwise, the results will not be accurate. You can also adjust the split time to match your own motion speed.

  • Step 10. Create Impulse

Click Create impulse -> Add a processing block -> Choose Spectral Analysis -> Add a learning block -> Choose Classification (Keras) -> Save Impulse


  • Step 11. Spectral features

Click Spectral features -> scroll down the page and click Save parameters -> click Generate features

When feature generation finishes, the output page shows the results in the feature explorer, with samples grouped by label.


  • Step 12. Training your model

Click NN Classifier -> Click Start training -> Choose Unoptimized (float32)


  • Step 13. Model testing

Click Model testing -> Click Classify all

If your accuracy is low, you can improve your dataset by increasing the training set and extending the sample time.

We can also see the evaluation results when downloading the model.


  • Step 14. Build Arduino library

Click Deployment -> Click Arduino Library -> Click Build -> Download the .ZIP file


  • Step 15. The name of the .ZIP file is important: by default it is set to the name of your Edge Impulse project. For example, here the project name is "RP2040". Add the .ZIP file to your Arduino libraries.


  • Step 16. Open Arduino -> Click Sketch -> Click Include Library -> Add .ZIP Library

Copy the code below. If you customized the project name on Edge Impulse, the .ZIP archive (and the generated header) will carry that name, so change the #include in the first line to match your own header file.

#include <XIAO_RP2040_inferencing.h>   // if your Edge Impulse project has a custom name, change this header to match
#include <Wire.h>
#include "MMA7660.h"

MMA7660 accelemeter;

#define CONVERT_G_TO_MS2    9.80665f
#define MAX_ACCEPTED_RANGE  2.0f

static bool debug_nn = false;   // set to true to print the features generated from the raw signal

void setup()
{
    Serial.begin(115200);
    while (!Serial);
    Serial.println("Edge Impulse Inferencing Demo");
    accelemeter.init();
}

float ei_get_sign(float number) {
    return (number >= 0.0) ? 1.0 : -1.0;
}

void loop()
{
    ei_printf("\nStarting inferencing in 2 seconds...\n");

    delay(2000);

    ei_printf("Sampling...\n");

    // Allocate a buffer here for the values we'll read from the IMU
    float buffer[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE] = { 0 };

    for (size_t ix = 0; ix < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; ix += 3) {
        // Determine the next tick (and then sleep later)
        uint64_t next_tick = micros() + (EI_CLASSIFIER_INTERVAL_MS * 1000);

        accelemeter.getAcceleration(&buffer[ix], &buffer[ix + 1], &buffer[ix + 2]);

        // Clip readings that exceed the accepted range
        for (int i = 0; i < 3; i++) {
            if (fabs(buffer[ix + i]) > MAX_ACCEPTED_RANGE) {
                buffer[ix + i] = ei_get_sign(buffer[ix + i]) * MAX_ACCEPTED_RANGE;
            }
        }

        // Convert from g to m/s^2
        buffer[ix + 0] *= CONVERT_G_TO_MS2;
        buffer[ix + 1] *= CONVERT_G_TO_MS2;
        buffer[ix + 2] *= CONVERT_G_TO_MS2;

        delayMicroseconds(next_tick - micros());
    }

    // Turn the raw buffer into a signal which we can then classify
    signal_t signal;
    int err = numpy::signal_from_buffer(buffer, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);
    if (err != 0) {
        ei_printf("Failed to create signal from buffer (%d)\n", err);
        return;
    }

    // Run the classifier
    ei_impulse_result_t result = { 0 };

    err = run_classifier(&signal, &result, debug_nn);
    if (err != EI_IMPULSE_OK) {
        ei_printf("ERR: Failed to run classifier (%d)\n", err);
        return;
    }

    // Print the predictions
    ei_printf("Predictions ");
    ei_printf("(DSP: %d ms., Classification: %d ms., Anomaly: %d ms.)",
              result.timing.dsp, result.timing.classification, result.timing.anomaly);
    ei_printf(": \n");
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        ei_printf("    %s: %.5f\n", result.classification[ix].label, result.classification[ix].value);
    }
#if EI_CLASSIFIER_HAS_ANOMALY == 1
    ei_printf("    anomaly score: %.3f\n", result.anomaly);
#endif
}


  • Step 17. Move or hold the Seeed Studio XIAO RP2040 and check the results:

Click the Serial Monitor icon in the top right corner of the Arduino IDE.

When you move the Seeed Studio XIAO RP2040 in a circle or line motion, the monitor will output something like:

15:45:45.434 -> 
15:45:45.434 -> Starting inferencing in 2 seconds...
15:45:47.414 -> Sampling...
15:45:48.439 -> Predictions (DSP: 6 ms., Classification: 1 ms., Anomaly: 0 ms.):
15:45:48.439 -> Circle: 0.59766
15:45:48.439 -> line: 0.40234
15:45:48.439 ->
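If you want the board to act on a recognized motion instead of only printing scores, you can post-process result.classification at the end of loop(). The helper below is a small sketch of one way to do that (it is not part of the original code): it finds the label with the highest score and drives the built-in LED when that score passes a threshold. It assumes LED_BUILTIN is defined for the XIAO RP2040 in your board package; remember to add pinMode(LED_BUILTIN, OUTPUT) in setup() and call the helper after run_classifier().

// Hypothetical helper: light the built-in LED when the top prediction matches
// a chosen label (e.g. "circle") with enough confidence.
// Call after run_classifier() has filled `result`, for example:
//     react_to_prediction(result, "circle", 0.8f);
void react_to_prediction(const ei_impulse_result_t &result,
                         const char *target_label, float threshold) {
    size_t best = 0;
    for (size_t ix = 1; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        if (result.classification[ix].value > result.classification[best].value) {
            best = ix;
        }
    }

    bool match = (strcmp(result.classification[best].label, target_label) == 0)
                 && (result.classification[best].value >= threshold);

    // On the XIAO RP2040 the user LED is typically active-low; adjust if needed.
    digitalWrite(LED_BUILTIN, match ? LOW : HIGH);
}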

Congratulations! You have reached the end of the project. You are encouraged to try more directions and check which one produces the best output.

Resources

Tech Support

Please do not hesitate to submit an issue on our forum.

